A Quadratically Regularized Functional Canonical Correlation Analysis for Identifying the Global Structure of Pleiotropy with NGS Data
Investigating the pleiotropic effects of genetic variants can increase
statistical power, provide important information to achieve deep understanding
of the complex genetic structures of disease, and offer powerful tools for
designing effective treatments with fewer side effects. However, the current
multiple phenotype association analysis paradigm lacks breadth (number of
phenotypes and genetic variants jointly analyzed at the same time) and depth
(hierarchical structure of phenotype and genotypes). A key issue for high
dimensional pleiotropic analysis is to effectively extract informative internal
representation and features from high dimensional genotype and phenotype data.
To explore multiple levels of representations of genetic variants, learn their
internal patterns involved in the disease development, and overcome critical
barriers in advancing the development of novel statistical methods and
computational algorithms for genetic pleiotropic analysis, we propose a new
framework, referred to as quadratically regularized functional CCA (QRFCCA),
for association analysis that combines three approaches: (1) quadratically
regularized matrix factorization, (2) functional data analysis and (3)
canonical correlation analysis (CCA). Large-scale simulations show that
QRFCCA has much higher power than the nine competing statistics while
retaining appropriate type I error rates. To further evaluate performance, the
QRFCCA and nine other statistics are applied to the whole genome sequencing
dataset from the TwinsUK study. We identify a total of 79 genes with rare
variants and 67 genes with common variants significantly associated with the 46
traits using QRFCCA. The results show that QRFCCA substantially outperforms
the nine other statistics.

Comment: 64 pages including 12 figures
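As a rough illustration of the CCA component at the heart of this framework, the sketch below computes a first canonical correlation with a quadratic (ridge-style) regularizer on the covariance estimates. This is generic regularized CCA on synthetic data, not the authors' full QRFCCA: the matrix-factorization and functional-smoothing stages are omitted.

```python
import numpy as np

def regularized_cca(X, Y, reg=1e-3):
    """First canonical correlation between X and Y, with a quadratic
    (ridge) regularizer added to each covariance estimate."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whiten both blocks; the canonical correlations are the singular
    # values of the whitened cross-covariance matrix.
    Kx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Ky = np.linalg.inv(np.linalg.cholesky(Cyy))
    s = np.linalg.svd(Kx @ Cxy @ Ky.T, compute_uv=False)
    return float(s[0])

# Synthetic genotype-like and phenotype-like blocks sharing one latent factor z.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))
X = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])
Y = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])
rho = regularized_cca(X, Y)   # close to 1: the shared factor is detected
```

The regularizer keeps the covariance inversions stable when the number of variants or traits approaches the sample size, which is the regime the abstract targets.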
Coil combination using linear deconvolution in k-space for phase imaging
Background: The combination of multi-channel data is a critical step for the imaging of phase and susceptibility contrast in magnetic resonance imaging (MRI). Magnitude-weighted phase combination methods often produce noise and aliasing artifacts in the magnitude images in accelerated imaging scenarios. To address this issue, an optimal coil combination method through deconvolution in k-space is proposed in this paper.
Methods: The proposed method first employs the sum-of-squares and phase-aligning method to yield a complex reference coil image, which is then used to calculate the coil sensitivity and its Fourier transform. Then, the coil k-space combining weights are computed, taking into account the truncated frequency data of the coil sensitivity and the acquired k-space data. Finally, combining the coil k-space data with these weights generates the k-space data of the proton distribution, from which both phase and magnitude information can be obtained straightforwardly. Both phantom and in vivo imaging experiments were conducted to evaluate the performance of the proposed method.
Results: Compared with the magnitude-weighted method and MCPC-C, the proposed method alleviates phase cancellation in coil combination, resulting in less phase wrapping.
Conclusions: The proposed method provides an effective and efficient approach to combining multiple coil images in parallel MRI reconstruction, and has the potential to benefit routine clinical practice in the future.
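The sensitivity-weighted combination step described above can be illustrated in the image domain. The paper works in k-space via deconvolution; this simplified sketch instead assumes the coil sensitivities are known exactly and shows why conjugate-sensitivity weighting preserves the object phase:

```python
import numpy as np

def combine_with_sensitivities(coil_imgs, sens):
    """Matched-filter coil combination: weight each coil image by the
    conjugate of its sensitivity, preserving the object phase."""
    num = (np.conj(sens) * coil_imgs).sum(axis=0)
    den = (np.abs(sens) ** 2).sum(axis=0)
    return num / np.maximum(den, 1e-12)

# Toy object with unit magnitude and a linear phase ramp, seen by
# four synthetic complex coil sensitivities.
ny, nx, nc = 32, 32, 4
_, x = np.mgrid[0:ny, 0:nx]
obj = np.exp(1j * 0.2 * x)
rng = np.random.default_rng(1)
sens = rng.normal(size=(nc, ny, nx)) + 1j * rng.normal(size=(nc, ny, nx))
coil_imgs = sens * obj
recon = combine_with_sensitivities(coil_imgs, sens)
err = np.max(np.abs(recon - obj))   # essentially zero: phase is preserved
```

In practice the sensitivities are unknown and must be estimated, which is exactly where the reference-image and truncated-frequency steps of the proposed method come in.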
Word Embedding based Correlation Model for Question/Answer Matching
With the development of community-based question answering (Q&A) services,
large-scale Q&A archives have been accumulated and have become an important
information and knowledge resource on the web. Much importance has been
attached to question and answer matching for its ability to reuse knowledge stored in
these systems: it can be useful in enhancing user experience with recurrent
questions. In this paper, we try to improve the matching accuracy by overcoming
the lexical gap between question and answer pairs. A Word Embedding based
Correlation (WEC) model is proposed by integrating advantages of both the
translation model and word embedding. Given a random pair of words, WEC can
score their co-occurrence probability in Q&A pairs, and it can also leverage the
continuity and smoothness of continuous space word representation to deal with
new pairs of words that are rare in the training parallel text. An experimental
study on Yahoo! Answers dataset and Baidu Zhidao dataset shows this new
method's promising potential.

Comment: 8 pages, 2 figures
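To make the word-pair scoring idea concrete, here is a toy sketch: a bilinear form through a matrix M, squashed by a sigmoid into a score in (0, 1). The vectors and the identity M are placeholders, not the trained WEC parameterization; in the real model M would be learned from aligned Q&A pairs.

```python
import numpy as np

def wec_score(q_vec, a_vec, M):
    """Score a (question-word, answer-word) pair: a bilinear form
    through M, squashed by a sigmoid into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(q_vec @ M @ a_vec)))

rng = np.random.default_rng(0)
dim = 8
M = np.eye(dim)                     # untrained placeholder; learned in WEC

def unit(v):
    return v / np.linalg.norm(v)

v_pay = unit(rng.normal(size=dim))
v_salary = unit(v_pay + 0.1 * rng.normal(size=dim))   # near-synonym embedding
v_cat = unit(rng.normal(size=dim))                    # unrelated word

s_syn = wec_score(v_pay, v_salary, M)
s_rand = wec_score(v_pay, v_cat, M)
```

Because nearby embeddings get similar scores, a pair unseen in training ("rare in the training parallel text") still receives a sensible score, which is the smoothness property the abstract emphasizes.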
No-compressing of quantum phase information
We raise a general question of quantum information theory: can quantum phase
information be compressed and retrieved? A general qubit contains both
amplitude and phase information, while an equatorial qubit contains only
phase information. We study whether it is possible to compress the phase
information of n equatorial qubits into m general qubits with m less than n,
and still retrieve that information perfectly. We prove that this process is
not allowed by quantum mechanics.

Comment: 4 pages, 1 figure
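For reference, an equatorial qubit is a state on the equator of the Bloch sphere, carrying a single phase parameter:

\[
|\psi(\varphi)\rangle = \frac{1}{\sqrt{2}}\left(|0\rangle + e^{i\varphi}|1\rangle\right),
\qquad \varphi \in [0, 2\pi).
\]

The no-compression claim is then that no quantum operation can map $n$ such states into $m < n$ general qubits and later recover every $|\psi(\varphi_k)\rangle$ perfectly.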
Cooperative Network Synchronization: Asymptotic Analysis
Accurate clock synchronization is required for collaborative operations among nodes across wireless networks. Compared with traditional layer-by-layer methods, cooperative network synchronization techniques lead to significant improvement in performance, efficiency, and robustness. This paper develops a framework for the performance analysis of cooperative network synchronization. We introduce the concepts of cooperative dilution intensity (CDI) and relative CDI to characterize the interaction between agents, which can be interpreted as properties of a random walk over the network. Our approach enables us to derive closed-form asymptotic expressions of performance limits, relating them to the quality of observations as well as the network topology.
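The random-walk interpretation mentioned above can be made concrete with a small example: a walk over a network whose transition probabilities follow the edge structure. This is only a generic illustration of the object involved; the CDI quantities themselves are defined in the paper.

```python
import numpy as np

# Adjacency of a small 4-agent network; the random walk moves from
# agent i to a uniformly chosen neighbor of i.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

# Stationary distribution via power iteration; for an undirected
# graph it is proportional to the node degrees.
pi = np.full(len(A), 1.0 / len(A))
for _ in range(500):
    pi = pi @ P
pi /= pi.sum()   # degrees are [2, 3, 3, 2], so pi = [0.2, 0.3, 0.3, 0.2]
```

Quantities like the stationary distribution and hitting statistics of such a walk are what tie synchronization performance to network topology in analyses of this kind.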
Alchemical and structural distribution based representation for improved QML
We introduce a representation of any atom in any chemical environment for the
generation of efficient quantum machine learning (QML) models of common
electronic ground-state properties. The representation is based on scaled
distribution functions explicitly accounting for elemental and structural
degrees of freedom. Resulting QML models afford very favorable learning curves
for properties of out-of-sample systems including organic molecules,
non-covalently bonded protein side-chains, (H2O)-clusters, as well as
diverse crystals. The elemental components help to lower the learning curves,
and, through interpolation across the periodic table, even enable "alchemical
extrapolation" to covalent bonding between elements not part of training, as
evinced for single, double, and triple bonds among main-group elements.
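The flavor of such a distribution-based atomic representation can be sketched as a charge-weighted, Gaussian-smeared radial distribution around each atom. This toy featurizer is illustrative only and is far simpler than the paper's scaled elemental and structural distribution functions:

```python
import numpy as np

def radial_feature(positions, charges, center, bins, width=0.2):
    """Charge-weighted, Gaussian-smeared radial distribution around
    one atom -- a toy stand-in for scaled distribution functions."""
    feat = np.zeros(len(bins))
    for j, (pos, z) in enumerate(zip(positions, charges)):
        if j == center:
            continue
        r = np.linalg.norm(pos - positions[center])
        feat += z * np.exp(-((bins - r) ** 2) / (2.0 * width ** 2))
    return feat

# Approximate water geometry (angstrom): O at the origin, two H atoms.
positions = np.array([[0.000, 0.000, 0.000],
                      [0.757, 0.586, 0.000],
                      [-0.757, 0.586, 0.000]])
charges = np.array([8.0, 1.0, 1.0])          # nuclear charges of O, H, H
bins = np.linspace(0.0, 3.0, 60)
f_O = radial_feature(positions, charges, center=0, bins=bins)
# The peak sits near the O-H bond length of about 0.96 angstrom.
```

Weighting by nuclear charge is one simple way to encode the elemental degree of freedom; the smooth dependence on charge is what makes interpolation across the periodic table conceivable.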
Randomized and efficient time synchronization in dynamic wireless sensor networks: a gossip-consensus-based approach
This paper proposes novel randomized gossip-consensus-based sync (RGCS) algorithms to realize efficient time correction in dynamic wireless sensor networks (WSNs). First, the unreliable links are described by stochastic connections, reflecting the changing connectivity characteristic of dynamic WSNs. Secondly, based on mutual drift estimation, each pair of activated nodes fully adjusts clock rate and offset to achieve network-wide time synchronization by drawing on the gossip consensus approach. The converge-to-max criterion is introduced to achieve a much faster convergence speed. Theoretical results on the probabilistic synchronization performance of RGCS are presented. Thirdly, a Revised-RGCS is developed to counteract the negative impact of bounded delays, because uncertain delays are always present in practice and would lead to a large deterioration in algorithm performance. Finally, extensive simulations are performed on the MATLAB and OMNeT++ platforms for performance evaluation. Simulation results demonstrate that the proposed algorithms are not only efficient for the synchronization issues raised by dynamic topology changes but also give better performance in terms of convergence speed, collision rate, and robustness to delay, outperforming other existing protocols.
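The converge-to-max gossip idea can be sketched in a few lines: randomly activated pairs repeatedly adopt the larger of their two clock values until the whole network agrees. The drift-rate estimation and delay compensation of the full RGCS are omitted here:

```python
import numpy as np

def gossip_sync_to_max(clocks, rounds, rng):
    """Randomized pairwise gossip with a converge-to-max update:
    each activated pair adopts the larger of its two clock values."""
    clocks = clocks.astype(float).copy()
    n = len(clocks)
    for _ in range(rounds):
        i, j = rng.choice(n, size=2, replace=False)   # random pair activation
        clocks[i] = clocks[j] = max(clocks[i], clocks[j])
    return clocks

rng = np.random.default_rng(7)
clocks = rng.uniform(0.0, 5.0, size=10)    # initial clock readings
synced = gossip_sync_to_max(clocks, rounds=500, rng=rng)
spread = synced.max() - synced.min()       # shrinks to zero at consensus
```

Converging to the maximum rather than the average is what speeds things up: the largest clock value spreads epidemically, so consensus needs no global averaging.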